Conversation

@heimoshuiyu
TLDR

Add a configurable DEFAULT_TOKEN_LIMIT environment variable, allowing users to customize the model context length (e.g. DEFAULT_TOKEN_LIMIT=256K)

Dive Deeper

The context length is currently hard-coded to 1M tokens; this change makes the limit configurable via an environment variable.
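A minimal sketch of how such a variable could be read and parsed. The `DEFAULT_TOKEN_LIMIT` name and the 1M default come from this PR description; the helper name, the `K`/`M` suffix handling, and the powers-of-1024 scaling are assumptions for illustration, not the actual patch.

```typescript
// Hard-coded default the PR description mentions (1M tokens, assumed to be 1024 * 1024).
const DEFAULT_LIMIT = 1_048_576;

// Parse a human-readable token limit such as "256K", "1M", or "4096".
// Unset or unparseable values fall back to the default.
function parseTokenLimit(raw: string | undefined, fallback: number = DEFAULT_LIMIT): number {
  if (!raw) return fallback;
  const match = /^(\d+(?:\.\d+)?)([KkMm]?)$/.exec(raw.trim());
  if (!match) return fallback;
  const scales: Record<string, number> = { '': 1, k: 1024, m: 1024 * 1024 };
  return Math.floor(parseFloat(match[1]) * scales[match[2].toLowerCase()]);
}

// Read the limit from the environment, e.g. DEFAULT_TOKEN_LIMIT=256K.
const tokenLimit = parseTokenLimit(process.env['DEFAULT_TOKEN_LIMIT']);
console.log(tokenLimit);
```

Falling back to the current hard-coded value keeps the change backward compatible for users who never set the variable.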

Reviewer Test Plan

Set DEFAULT_TOKEN_LIMIT to a low value (e.g. 4K) and observe the context usage indicator decreasing rapidly during use.

Testing Matrix

|          | 🍏 | 🪟 | 🐧 |
|----------|----|----|----|
| npm run  |    |    |    |
| npx      |    |    |    |
| Docker   |    |    |    |
| Podman   | -  | -  |    |
| Seatbelt |    | -  | -  |

Linked issues / bugs

None

@zhutao100
Contributor

You might be interested in #542

@Mingholy
Collaborator

Mingholy commented Sep 9, 2025

Support for process.env will still be needed after #542 is merged, if you're interested.
